Modifying the Line Search Formula in the BFGS Method to Achieve Global Convergence
Abstract:
Nonlinear programming problems are among the most commonly encountered optimization problems, and in most cases their objective functions are non-convex. However, algorithms based on Newton's method generally require a convexity condition to guarantee global convergence. Quasi-Newton techniques are more popular because they work with an approximation of the Hessian matrix or its inverse, and these algorithms build that approximation from gradient information alone. One of the most widely used quasi-Newton algorithms for nonlinear programming problems is the BFGS method. This paper presents a new line search idea for the BFGS method and proves that this technique leads to global convergence for general problems without the need for any additional conditions. Finally, the performance of the proposed algorithm is evaluated numerically.
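The BFGS iteration the abstract refers to maintains an inverse-Hessian approximation and pairs it with a line search to pick the step length. A minimal Python sketch of this scheme follows; note that the paper's own modified line search rule is not reproduced here, so a standard Armijo backtracking rule stands in, and the test function, tolerances, and parameter names are illustrative assumptions:

```python
import numpy as np

def bfgs_armijo(f, grad, x0, tol=1e-6, max_iter=500, c1=1e-4, beta=0.5):
    """Minimize f via BFGS with a backtracking (Armijo) line search.

    Illustrative sketch only: the paper proposes its own modified line
    search formula, not shown here; ordinary Armijo backtracking is
    used in its place for concreteness.
    """
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                     # approximation of the inverse Hessian
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                    # quasi-Newton search direction
        # Backtrack until the Armijo sufficient-decrease condition holds.
        t = 1.0
        while f(x + t * p) > f(x) + c1 * t * (g @ p):
            t *= beta
        s = t * p
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        if sy > 1e-12:                # curvature check keeps H positive definite
            rho = 1.0 / sy
            V = np.eye(n) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Rosenbrock: a classic nonconvex test problem with minimizer (1, 1).
def rosenbrock(x):
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

def rosenbrock_grad(x):
    return np.array([
        -400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0] ** 2),
    ])

x_star = bfgs_armijo(rosenbrock, rosenbrock_grad, [-1.2, 1.0])
```

Skipping the update when the curvature product s·y is not safely positive is a common safeguard related to the "cautious" updates discussed in the similar works below: it preserves positive definiteness of H even on nonconvex objectives.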
Similar resources
The Search for the Self in Beckett's Theatre: Waiting for Godot and Endgame
This thesis is based upon the works of Samuel Beckett, one of the greatest writers of contemporary literature. Here, I have tried to focus on one of the main themes in Beckett's works: the search for the real "me" or the real self, which is a problem to be solved not only for Beckett's man but also for each of us. I have tried to show Beckett's techniques in approaching this unattainable goal, base...
On the Global Convergence of the BFGS Method for Nonconvex Unconstrained Optimization Problems
This paper is concerned with the open problem of whether the BFGS method with an inexact line search converges globally when applied to nonconvex unconstrained optimization problems. We propose a cautious BFGS update and prove that the method with either a Wolfe-type or an Armijo-type line search converges globally if the function to be minimized has Lipschitz continuous gradients.
Global convergence of online limited memory BFGS
Global convergence of an online (stochastic) limited memory version of the Broyden-Fletcher-Goldfarb-Shanno (BFGS) quasi-Newton method for solving optimization problems with stochastic objectives that arise in large scale machine learning is established. Lower and upper bounds on the Hessian eigenvalues of the sample functions are shown to suffice to guarantee that the curvature approximation ma...
The Investigation of Research Articles in Applied Linguistics: Convergence and Divergence in Iranian ELT Context
No abstract available.
From Linguistics to Literature: A Linguistic Approach to the Study of Linguistic Deviations in the Turkish Divan of Shahriar
Chapter I provides an overview of structural linguistics and touches upon the Saussurean dichotomies with the final goal of exploring their relevance to the stylistic studies of literature. To provide evidence for the significance of the study, Chapter II deals with the controversial issue of linguistics and literature, and presents opposing views which, at the same time, have been central to t...
Journal title
Volume 5, Issue 21
Pages 37-46
Published 2019-12-22